17 research outputs found

    Pulse shape and voltage-dependent synchronization in spiking neuron networks

    Full text link
    Pulse-coupled spiking neural networks are a powerful tool to gain mechanistic insights into how neurons self-organize to produce coherent collective behavior. These networks use simple spiking neuron models, such as the θ-neuron or the quadratic integrate-and-fire (QIF) neuron, that replicate the essential features of real neural dynamics. Interactions between neurons are modeled with infinitely narrow pulses, or spikes, rather than the more complex dynamics of real synapses. To make these networks biologically more plausible, it has been proposed that they must also account for the finite width of the pulses, which can have a significant impact on the network dynamics. However, the derivation and interpretation of these pulses are contradictory, and the impact of the pulse shape on the network dynamics is largely unexplored. Here, I take a comprehensive approach to pulse-coupling in networks of QIF and θ-neurons. I argue that narrow pulses activate voltage-dependent synaptic conductances and show how to implement them in QIF neurons such that their effect can last through the phase after the spike. Using an exact low-dimensional description for networks of globally coupled spiking neurons, I prove for instantaneous interactions that collective oscillations emerge due to an effective coupling through the mean voltage. I analyze the impact of the pulse shape by means of a family of smooth pulse functions with arbitrary finite width and symmetric or asymmetric shapes. For symmetric pulses, the resulting voltage-coupling is largely ineffective at synchronizing neurons, but pulses that are slightly skewed towards the phase after the spike readily generate collective oscillations. The results unveil a voltage-dependent spike synchronization mechanism in neural networks, which is facilitated by pulses of finite width and complementary to traditional synaptic transmission. Comment: 38 pages, 11 figures
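    For orientation, a minimal sketch of the two neuron models named above and their well-known equivalence; the symbols η (excitability parameter) and I(t) (synaptic input) are my notation, not necessarily the paper's:

```latex
% QIF neuron: quadratic voltage dynamics with reset from +infinity to -infinity
\dot V = V^2 + \eta + I(t), \qquad V \to +\infty \;\Rightarrow\; V \leftarrow -\infty .
% The change of variables V = \tan(\theta/2) yields the theta-neuron
% (Ermentrout--Kopell canonical model); a spike corresponds to theta crossing pi:
\dot\theta = (1 - \cos\theta) + (1 + \cos\theta)\,\bigl(\eta + I(t)\bigr).
```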

    Ott-Antonsen attractiveness for parameter-dependent oscillatory systems

    Get PDF
    The Ott-Antonsen (OA) ansatz [Ott and Antonsen, Chaos 18, 037113 (2008); Chaos 19, 023117 (2009)] has been widely used to describe large systems of coupled phase oscillators. If the coupling is sinusoidal and if the phase dynamics does not depend on the specific oscillator, then the macroscopic behavior of the systems can be fully described by a low-dimensional dynamics. Does the corresponding manifold remain attractive when introducing an intrinsic dependence between an oscillator's phase and its dynamics through additional, oscillator-specific parameters? To answer this, we extended the OA ansatz and proved that parameter-dependent oscillatory systems converge to the OA manifold given certain conditions. Our proof confirms recent numerical findings that already hinted at this convergence. Furthermore, we offer a thorough mathematical underpinning for networks of so-called theta neurons, where the OA ansatz has recently been applied. In a final step, we extend our proof by allowing for time-dependent and multi-dimensional parameters as well as for network topologies other than global coupling. This renders the OA ansatz an excellent starting point for the analysis of a broad class of realistic settings.
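    As a hedged reminder of the ansatz in its standard form (the parameter-dependent extension proved in the paper is not reproduced here), the phase density is restricted to a single geometric mode; for the Kuramoto model with a Lorentzian frequency density of half-width Δ centered at ω₀ this yields a one-complex-dimensional flow for the order parameter z:

```latex
% Ott--Antonsen ansatz: all Fourier modes of the phase density are powers of one function
f(\theta,\omega,t) = \frac{g(\omega)}{2\pi}
  \Bigl[\, 1 + \sum_{n=1}^{\infty} \bigl( \alpha(\omega,t)^{\,n} e^{i n\theta} + \mathrm{c.c.} \bigr) \Bigr],
% reduced Kuramoto dynamics on the OA manifold (Lorentzian g, half-width Delta, center omega_0)
\dot z = \bigl(-\Delta + i\,\omega_0\bigr)\, z + \frac{K}{2}\bigl( z - \bar z\, z^{2} \bigr).
```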

    Modeling phase synchronization of interacting neuronal populations: from phase reductions to collective behavior of oscillatory neural networks

    Get PDF
    Synchronous, coherent interaction is key for the functioning of our brain. The coordinated interplay between neurons and neural circuits allows the brain to perceive, process, and transmit information. As such, synchronization phenomena occur across all scales. Phase synchronization is hypothesized to underlie the coordination of oscillatory activity between cortical regions. Accordingly, phase models have found their way into neuroscience. The concepts of neural synchrony and oscillations are introduced in Chapter 1 and linked to phase synchronization phenomena in oscillatory neural networks. Chapter 2 provides the necessary mathematical theory upon which a sound phase description builds. I outline phase reduction techniques to distill the phase dynamics from complex oscillatory networks. In Chapter 3 I apply them to networks of weakly coupled Brusselators and of Wilson-Cowan neural masses. Numerical and analytical approaches are compared against each other and their sensitivity to parameter regions and nonlinear coupling schemes is analyzed. In Chapters 4 and 5 I investigate synchronization phenomena of complex phase oscillator networks. First, I study the effects of network-network interactions on the macroscopic dynamics when coupling two symmetric populations of phase oscillators. This setup is compared against a single network of oscillators whose frequencies are distributed according to a symmetric bimodal Lorentzian. Subsequently, I extend the applicability of the Ott-Antonsen ansatz to parameter-dependent oscillatory systems. This allows for capturing the collective dynamics of coupled oscillators when additional parameters influence the individual dynamics. Chapter 6 draws the connection to experimental data. The phase time series of resting-state MEG data display large-scale brain activity at the edge of criticality. Neurophysiological phase models are reduced from the underlying dynamics of Wilson-Cowan and Freeman neural masses and analyzed with respect to two complementary notions of critical dynamics. A general discussion and an outlook on future work are provided in the final Chapter 7.

    Low-dimensional firing-rate dynamics for populations of renewal-type spiking neurons

    Full text link
    The macroscopic dynamics of large populations of neurons can be mathematically analyzed using low-dimensional firing-rate or neural-mass models. However, these models fail to capture spike synchronization effects of stochastic spiking neurons such as the non-stationary population response to rapidly changing stimuli. Here, we derive low-dimensional firing-rate models for homogeneous populations of general renewal-type neurons, including integrate-and-fire models driven by white noise. Renewal models account for neuronal refractoriness and spike synchronization dynamics. The derivation is based on an eigenmode expansion of the associated refractory density equation, which generalizes previous spectral methods for Fokker-Planck equations to arbitrary renewal models. We find a simple relation between the eigenvalues, which determine the characteristic time scales of the firing-rate dynamics, and the Laplace transform of the interspike-interval density or the survival function of the renewal process. Analytical expressions for the Laplace transforms are readily available for many renewal models, including the leaky integrate-and-fire model. Retaining only the first eigenmode already yields an adequate low-dimensional approximation of the firing-rate dynamics that captures spike synchronization effects and fast transient dynamics at stimulus onset. We explicitly demonstrate the validity of our model for a large homogeneous population of Poisson neurons with absolute refractoriness, and for other renewal models that admit an explicit analytical calculation of the eigenvalues. The eigenmode expansion presented here provides a systematic framework for novel firing-rate models in computational neuroscience based on spiking neuron dynamics with refractoriness. Comment: 24 pages, 7 figures
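    The abstract notes that the relevant Laplace transforms are available in closed form; as a hedged worked example (my notation) for the case it mentions explicitly, a Poisson neuron with rate r and absolute refractory period Δ:

```latex
% Interspike-interval density and its Laplace transform for a Poisson neuron
% with absolute refractoriness (Theta denotes the Heaviside step function)
P(t) = r\, e^{-r (t-\Delta)}\,\Theta(t-\Delta), \qquad
\tilde P(s) = \int_0^\infty e^{-s t} P(t)\,\mathrm{d}t = \frac{r\, e^{-s\Delta}}{s + r},
% survival function of the renewal process
S(t) = \begin{cases} 1, & t < \Delta, \\ e^{-r (t-\Delta)}, & t \ge \Delta. \end{cases}
```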

    First-order phase transitions in the Kuramoto model with compact bimodal frequency distributions

    Get PDF
    The Kuramoto model of a network of coupled phase oscillators exhibits a first-order phase transition when the distribution of natural frequencies has a finite flat region at its maximum. First-order phase transitions including hysteresis and bistability are also present if the frequency distribution of a single network is bimodal. In this study, we are interested in the interplay of these two configurations and analyze the Kuramoto model with compact bimodal frequency distributions in the continuum limit. As yet, a rigorous analytic treatment has been elusive. By combining Kuramoto's self-consistency approach with Crawford's symmetry considerations, and by exploiting the Ott-Antonsen ansatz applied to a family of rational distribution functions that converge towards the compact distribution, we derive a full bifurcation diagram for the system's order-parameter dynamics. We show that the route to synchronization always passes through a standing wave regime when the bimodal distribution is composed of two unimodal distributions with compact support. This is in contrast to a possible transition across a region of bistability when the two constituent unimodal distributions have infinite support.
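    For context, the underlying model in its standard form (a hedged sketch; the specific compact bimodal density analyzed in the paper is only indicated schematically):

```latex
% Kuramoto model and its order parameter
\dot\theta_i = \omega_i + \frac{K}{N}\sum_{j=1}^{N}\sin(\theta_j - \theta_i), \qquad
r\, e^{i\psi} = \frac{1}{N}\sum_{j=1}^{N} e^{i\theta_j},
% schematic compact bimodal frequency density: two unimodal components g_0 with
% bounded support, centered at +/- omega_0
g(\omega) = \tfrac{1}{2}\bigl[\, g_0(\omega - \omega_0) + g_0(\omega + \omega_0) \bigr].
```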

    Mesoscopic description of hippocampal replay and metastability in spiking neural networks with short-term plasticity

    Get PDF
    Bottom-up models of functionally relevant patterns of neural activity provide an explicit link between neuronal dynamics and computation. A prime example of such a functional activity pattern is hippocampal replay, which is critical for memory consolidation. The switching between replay events and a low-activity state in neural recordings suggests metastable neural circuit dynamics. As metastability has been attributed to noise and/or slow fatigue mechanisms, we propose a concise mesoscopic model which accounts for both. Crucially, our model is bottom-up: it is analytically derived from the dynamics of finite-size networks of Linear-Nonlinear Poisson neurons with short-term synaptic depression. As such, noise is explicitly linked to spike noise and network size, and fatigue is explicitly linked to synaptic dynamics. To derive the mesoscopic model, we first consider a homogeneous spiking neural network and follow the temporal coarse-graining approach of Gillespie ("chemical Langevin equation"), which can be naturally interpreted as a stochastic neural mass model. The Langevin equation is computationally inexpensive to simulate and enables a thorough study of metastable dynamics in classical setups (population spikes and Up-Down state dynamics) by means of phase-plane analysis. This stochastic neural mass model is the basic component of our mesoscopic model for replay. We show that our model faithfully captures the stochastic nature of individual replayed trajectories. Moreover, compared to the deterministic Romani-Tsodyks model of place cell dynamics, it exhibits a higher level of variability in terms of content, direction and timing of replay events, which is compatible with biological evidence and could be functionally desirable. This variability is the product of a new dynamical regime where metastability emerges from a complex interplay between finite-size fluctuations and local fatigue. Comment: 43 pages, 8 figures
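    A minimal sketch of the kind of stochastic neural-mass dynamics described above, assuming a single homogeneous population with Tsodyks-Markram-style depression and spike-count noise scaling with 1/sqrt(N); the transfer function, parameter values, and variable names are illustrative assumptions, not the model derived in the paper:

```python
import numpy as np

def simulate_population(T=2.0, dt=1e-4, N=500, seed=0):
    """Euler-Maruyama sketch of a stochastic neural-mass model with short-term
    synaptic depression and finite-size (spike-count) noise. Illustrative only."""
    rng = np.random.default_rng(seed)
    steps = int(T / dt)

    tau_h, tau_d, U = 0.02, 0.5, 0.5   # input and depression time constants (s), release prob.
    J, h_rest = 1.5, -0.2              # recurrent coupling strength, resting input
    r_max = 100.0                      # saturation rate of the transfer function (Hz)

    def rate(h):
        # Sigmoidal transfer function (assumption; stands in for an LNP nonlinearity)
        return r_max / (1.0 + np.exp(-5.0 * h))

    h, x = h_rest, 1.0                 # input potential, fraction of available resources
    out = np.empty((steps, 3))
    for k in range(steps):
        A = rate(h)                                               # population rate (Hz)
        # spikes per neuron in this bin: mean A*dt, fluctuations ~ sqrt(A*dt/N)
        s = max(A * dt + np.sqrt(A * dt / N) * rng.standard_normal(), 0.0)
        h += dt * (h_rest - h) / tau_h + (J * U * x / tau_h) * s  # depressed recurrent drive
        x += dt * (1.0 - x) / tau_d - U * x * s                   # resource depletion / recovery
        out[k] = (k * dt, A, x)
    return out

if __name__ == "__main__":
    traj = simulate_population()
    print("mean population rate (Hz): %.1f" % traj[:, 1].mean())
```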

    Equivalence of coupled networks and networks with multimodal frequency distributions: Conditions for the bimodal and trimodal case

    Get PDF
    Populations of oscillators can display a variety of synchronization patterns depending on the oscillators' intrinsic coupling and the coupling between them. We consider two coupled symmetric (sub)populations with unimodal frequency distributions. If internal and external coupling strengths are identical, a change of variables transforms the system into a single population of oscillators whose natural frequencies are bimodally distributed. Otherwise, an additional bifurcation parameter κ enters the dynamics. By using the Ott-Antonsen ansatz, we rigorously prove that κ does not lead to new bifurcations, but that a symmetric two-coupled-population network and a network with a symmetric bimodal frequency distribution are topologically equivalent. Seeking generalizations, we further analyze a symmetric trimodal network vis-à-vis three coupled symmetric unimodal populations. Here, however, the equivalence with respect to stability, dynamics, and bifurcations of the two systems no longer holds.
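    A hedged sketch of the two-population setup in its usual OA-reduced form (my notation, not the paper's): each symmetric subpopulation σ = 1, 2 with Lorentzian frequency spread Δ and center ω_σ carries an order parameter z_σ driven by an internal coupling K_{σσ} and an external coupling K_{στ}; the extra parameter κ mentioned above appears when these coupling strengths differ:

```latex
% OA-reduced dynamics of two sinusoidally coupled populations of phase oscillators
\dot z_\sigma = \bigl(-\Delta + i\,\omega_\sigma\bigr) z_\sigma
  + \tfrac{1}{2}\bigl( H_\sigma - \bar H_\sigma\, z_\sigma^{2} \bigr),
\qquad
H_\sigma = \sum_{\tau=1}^{2} K_{\sigma\tau}\, z_\tau , \qquad \sigma = 1,2 .
```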

    Exact finite-dimensional description for networks of globally coupled spiking neurons

    Full text link
    We consider large networks of globally coupled spiking neurons and derive an exact low-dimensional description of their collective dynamics in the thermodynamic limit. Individual neurons are described by the Ermentrout-Kopell canonical model that can be excitable or tonically spiking, and interact with other neurons via pulses. Utilizing the equivalence of the quadratic integrate-and-fire and the theta neuron formulations, we first derive the dynamical equations in terms of the Kuramoto-Daido order parameters (Fourier modes of the phase distribution) and relate them to two biophysically relevant macroscopic observables, the firing rate and the mean voltage. For neurons driven by Cauchy white noise or for Cauchy-Lorentz distributed input currents, we adapt the results by Cestnik and Pikovsky [arXiv:2207.02302 (2022)] and show that for arbitrary initial conditions the collective dynamics reduces to six dimensions. We also prove that in this case the dynamics asymptotically converges to a two-dimensional invariant manifold first discovered by Ott and Antonsen. For identical, noise-free neurons, the dynamics reduces to three dimensions, becoming equivalent to the Watanabe-Strogatz description. We illustrate the exact six-dimensional dynamics outside the invariant manifold by calculating nontrivial basins of different asymptotic regimes in a bistable situation. Comment: 14 pages, 3 figures
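    For reference, a hedged statement of the order parameters mentioned above and of the property that defines the Ott-Antonsen manifold (standard definitions, in my notation):

```latex
% Kuramoto--Daido order parameters: Fourier modes of the phase distribution
Z_n(t) = \frac{1}{N}\sum_{j=1}^{N} e^{\,i n \theta_j(t)}, \qquad n = 1, 2, \dots
% On the Ott--Antonsen manifold all higher modes are powers of the first,
% so the macroscopic state is captured by Z_1 alone:
Z_n = Z_1^{\,n}.
```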

    Exact firing rate model reveals the differential effects of chemical versus electrical synapses in spiking networks

    Get PDF
    Chemical and electrical synapses shape the dynamics of neuronal networks. Numerous theoretical studies have investigated how each of these types of synapses contributes to the generation of neuronal oscillations, but their combined effect is less understood. This limitation is further magnified by the inability of traditional neuronal mean-field models, also known as firing rate models or firing rate equations, to account for electrical synapses. Here, we introduce a firing rate model that exactly describes the mean-field dynamics of heterogeneous populations of quadratic integrate-and-fire (QIF) neurons with both chemical and electrical synapses. The mathematical analysis of the firing rate model reveals a well-established bifurcation scenario for networks with chemical synapses, characterized by a codimension-2 cusp point and persistent states for strong recurrent excitatory coupling. The inclusion of electrical coupling generally implies neuronal synchrony by virtue of a supercritical Hopf bifurcation. This transforms the cusp scenario into a bifurcation scenario characterized by three codimension-2 points (cusp, Takens-Bogdanov, and saddle-node separatrix loop), which greatly reduces the possibility of persistent states. This scenario is generic for heterogeneous QIF networks with both chemical and electrical couplings. Our results agree with several numerical studies on the dynamics of large networks of heterogeneous spiking neurons with electrical and chemical couplings.
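    As a point of reference, a hedged sketch of the exact firing-rate equations for a heterogeneous QIF population with chemical coupling only (the Montbrió-Pazó-Roxin form, with Lorentzian-distributed inputs of center η̄ and half-width Δ and the membrane time constant set to one); the additional electrical-coupling terms introduced in the paper are not reproduced here:

```latex
% Exact firing-rate equations for QIF neurons with chemical synapses of strength J
% r: population firing rate, v: mean membrane voltage, I(t): external current
\dot r = \frac{\Delta}{\pi} + 2\, r\, v, \qquad
\dot v = v^{2} + \bar\eta + J\, r + I(t) - \pi^{2} r^{2}.
```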

    Network dynamics of coupled oscillators and phase reduction techniques

    No full text
    Investigating the dynamics of a network of oscillatory systems is a timely and urgent topic. Phase synchronization has proven a paradigmatic concept for studying emergent collective behavior within a network. Defining the phase dynamics, however, is not a trivial task. The literature provides an arsenal of solutions, but results are scattered and their formulation is far from standardized. Here, we present, in a unified language, a catalogue of popular techniques for deriving the phase dynamics of coupled oscillators. Traditionally, approaches to phase reduction address the (weakly) perturbed dynamics of an oscillator. They fall into three classes. (i) Many phase reduction techniques start off with a Hopf normal form description, thereby providing mathematical rigor. There, the caveat is to first derive the proper normal form. We explicate several ways to do that, both analytically and (semi-)numerically. (ii) Other analytic techniques capitalize on time scale separation and/or averaging over cyclic variables. While appealing for their more intuitive implementation, they often lack accuracy. (iii) Direct numerical approaches help to identify oscillatory behavior but may limit an overarching view of how the reduced phase dynamics depends on model parameters. After illustrating and reviewing the necessary mathematical details for single oscillators, we turn to networks of coupled oscillators as the central issue of this report. We show in detail how the concepts of phase reduction for single oscillators can be extended and applied to oscillator networks. Again, we distinguish between numerical and analytic phase reduction techniques. As the latter build on a network normal form, we also discuss the associated reduction methods. To illustrate benefits and pitfalls of the different phase reduction techniques, we apply them point-by-point to two classic examples: networks of Brusselators and a more elaborate model of coupled Wilson-Cowan oscillators. The reduction of complex oscillatory systems is crucial for numerical analyses, but even more so for analytical estimates and model prediction. The most common reduction is towards phase oscillator networks, which have proven successful not only in describing the transition between incoherence and global synchronization, but also in predicting the existence of less trivial network states. Many of these predictions have been confirmed in experiments. As we show, however, the phase dynamics depends to a large extent on the phase reduction technique employed. In view of current and future trends, we also provide an overview of various methods for augmented phase reduction as well as for phase-amplitude reduction. We indicate how these techniques can be extended to oscillator networks and, hence, may allow for an improved derivation of the phase dynamics of coupled oscillators.
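    To fix ideas, a hedged sketch of the first-order phase reduction that recurs throughout such reports (standard weak-coupling form, not tied to any particular technique from the catalogue): the phase sensitivity function Z of the uncoupled limit cycle projects the pairwise coupling g onto the phase, and averaging over one period yields the coupling function H:

```latex
% Weakly coupled oscillators: first-order phase reduction and averaging
\dot\theta_i = \omega + \varepsilon\, Z(\theta_i)\cdot \sum_{j} g\bigl(x(\theta_i), x(\theta_j)\bigr)
\;\;\longrightarrow\;\;
\dot\theta_i = \omega + \varepsilon \sum_{j} H(\theta_j - \theta_i),
\qquad
H(\varphi) = \frac{1}{2\pi}\int_0^{2\pi} Z(\psi)\cdot g\bigl(x(\psi), x(\psi + \varphi)\bigr)\,\mathrm{d}\psi .
```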